Results 1 - 12 of 12
1.
Int J Environ Res Public Health ; 19(17)2022 Sep 02.
Article in English | MEDLINE | ID: covidwho-2010032

ABSTRACT

The conversion rate between asymptomatic infections and reported/unreported symptomatic infections is a highly sensitive parameter in models of COVID-19 spread. This is important information for follow-up use in screening, prediction, prognostics, contact tracing, and drug development during the COVID-19 pandemic. The model described here suggests that there may not be enough researchers to solve all of these problems thoroughly and effectively; careful selection of priorities, rapid sharing of results and models, and optimization of modelling simulations are therefore required to reduce the impact of COVID-19. Exploring simulation modelling will help decision makers make the most informed decisions. In the fight against the "Delta" variant, establishing a line of defence through all-people testing (APT) is not only an effective method distilled from past experience but also one of the best means of cutting the chain of epidemic transmission. The effect of large-scale testing has been fully verified in the international community. We developed a practical dynamic infectious disease model, SETPG (A + I) RD + APT, that incorporates the effects of all-people testing (APT). The model is useful for studying the effects of screening measures and provides more realistic modelling of all-people-test strategies, which require everybody in a population to be tested for infection. In prior work, a total of 370 epidemic cases were collected. We collected three kinds of known case data: daily cumulative incidence, daily cumulative recoveries, and daily cumulative deaths in Hong Kong and the United States between 22 January 2020 and 13 November 2020, which were simulated. For two essential strategies of the integrated SETPG (A + I) RD + APT model, comparing the cumulative number of screenings in derivative experiments based on daily detection capability and tracking-system application rate, we evaluated the timespan required for the basic reproduction number (R0) and the real-time reproduction number (R0t) to reach 1; the optimal policy of each experiment is reported, and the screening effect is evaluated with screening performance indicators. With the binary-encoding screening method, the number of screenings for the target population is 8667 in HK and 1,803,400 in the U.S., including 6067 asymptomatic cases in HK and 1,262,380 in the U.S. as well as 2599 cases of mild symptoms in HK and 541,020 in the U.S.; the screening timespan is 8.25 days in HK and 9.25 days in the U.S., with a daily detection capability of 625,000 cases in HK and 6,050,000 cases in the U.S. Using precise tracking technology, the number of screenings for the target population is 6060 cases in HK and 1,766,420 cases in the U.S., including 4242 asymptomatic cases in HK and 1,236,494 cases in the U.S. as well as 1818 cases of mild symptoms in HK and 529,926 cases in the U.S. The total screening timespan (TS) is 8.25~9.25 days. With the proposed infectious-dynamics model adapted to all-people testing, all reported epidemic cases were used for fitting; the results appear more reasonable and the epidemic predictions are more accurate. The model is suited to densely populated metropolises applying APT for prevention.
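To illustrate the kind of question the abstract describes (how quickly a testing-and-isolation policy pushes the reproduction number below 1), the following is a minimal, self-contained sketch. It is not the authors' SETPG (A + I) RD + APT implementation; the SEIR-style structure, the APT detection rate, and all parameter values are illustrative assumptions only.

```python
# Minimal sketch (NOT the authors' SETPG(A+I)RD+APT code): an SEIR-style model
# extended with an all-people-testing (APT) rate that moves infectious people
# into isolation, plus a check of how many days it takes the effective
# reproduction number R_t to fall below 1. All parameters are assumed values.

def days_until_controlled(apt_rate, days=200, N=7_500_000,
                          beta=0.6, sigma=1/5.2, gamma=1/10):
    S, E, I, R = N - 10, 10.0, 0.0, 0.0
    for t in range(days):
        new_exposed  = beta * S * I / N
        new_infect   = sigma * E
        recovered    = gamma * I
        test_removed = apt_rate * I          # infectious found by mass testing
        S -= new_exposed
        E += new_exposed - new_infect
        I += new_infect - recovered - test_removed
        R += recovered + test_removed
        Rt = beta / (gamma + apt_rate) * S / N   # effective reproduction number
        if Rt < 1:
            return t
    return None

print("days until R_t < 1 without APT:", days_until_controlled(apt_rate=0.0))
print("days until R_t < 1 with APT:   ", days_until_controlled(apt_rate=0.2))
```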


Subject(s)
COVID-19 , Communicable Diseases , COVID-19/diagnosis , COVID-19/epidemiology , COVID-19 Testing , Communicable Diseases/epidemiology , Humans , Pandemics/prevention & control , SARS-CoV-2 , United States
2.
Future Generation Computer Systems ; 2022.
Article in English | ScienceDirect | ID: covidwho-1719763

ABSTRACT

In the medical domain, data are often collected over time, evolving from simple to refined categories. The data, and the underlying structures of medical data that have grown into today's complexity, can be traced back to cruder forms at the start of data collection. For instance, a cancer dataset is labeled either benign or malignant in its simplest, or perhaps earliest, form. As medical knowledge advances and/or more data become available, the dataset progresses from binary class to multi-class, with more labels for sub-categories of the disease added. In machine learning, inducing a multi-class model requires more computational power. Model optimization is enforced over the multi-class models for the highest possible accuracy, which, of course, is necessary for life-and-death decision making. This model optimization task consumes an extremely long model training time. In this paper, a novel strategy called Group-of-Single-Class prediction (GOSC), coupled with majority voting and model transfer, is proposed for achieving maximum accuracy using only a fraction of the model training time. The main advantage is the ability to obtain an optimized multi-class classification model whose accuracy is close to the absolute maximum, while up to 70% of the training time can be saved. Experiments were conducted on machine learning over liver dataset classification and deep learning over COVID-19 lung CT images. Preliminary results suggest the feasibility of this new approach.
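As a rough illustration of the group-of-single-class idea described above (one lightweight model per label, combined by voting), here is a hedged sketch. It is not the authors' GOSC implementation and omits the model-transfer component; the dataset (scikit-learn's wine set) and the logistic-regression base model are assumptions chosen only to make the example runnable.

```python
# Hedged sketch of a "group of single-class" ensemble: one binary model per
# class label, combined by taking the highest per-class score (a simple vote).
# Not the paper's GOSC algorithm; dataset and base model are illustrative.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Train one single-class (one-vs-rest) model per label.
models = {}
for label in np.unique(y_tr):
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    clf.fit(X_tr, (y_tr == label).astype(int))
    models[label] = clf

# "Vote": each per-class model scores every test sample; highest score wins.
labels = sorted(models)
scores = np.column_stack([models[k].predict_proba(X_te)[:, 1] for k in labels])
y_pred = np.array(labels)[scores.argmax(axis=1)]
print("accuracy of the voted single-class group:", (y_pred == y_te).mean())
```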

3.
Int J Biol Sci ; 17(6): 1581-1587, 2021.
Article in English | MEDLINE | ID: covidwho-1206429

ABSTRACT

Artificial intelligence (AI) is being used to aid in various aspects of the COVID-19 crisis, including epidemiology, molecular research and drug development, medical diagnosis and treatment, and socioeconomics. Combining AI with COVID-19 workflows can accelerate the rapid diagnosis of positive patients. To learn the dynamics of a pandemic with relevance to AI, we searched the literature using different academic databases (PubMed, PubMed Central, Scopus, Google Scholar) and preprint servers (bioRxiv, medRxiv, arXiv). In the present review, we address the clinical applications of machine learning and deep learning, including clinical characteristics, electronic medical records, and medical images (CT, X-ray, ultrasound images, etc.) in COVID-19 diagnosis. The current challenges and future perspectives provided in this review can be used to direct an ideal deployment of AI technology in a pandemic.


Subject(s)
Artificial Intelligence , COVID-19/diagnosis , COVID-19/virology , COVID-19 Testing/methods , Humans , Machine Learning , SARS-CoV-2/isolation & purification
4.
Phys Biol ; 18(4)2021 05 28.
Article in English | MEDLINE | ID: covidwho-1192595

ABSTRACT

In this paper, we demonstrate the application of MATLAB to develop a pandemic prediction system based on Simulink. The susceptible-exposed-asymptomatic but infectious-symptomatic and infectious (severe infected population + mild infected population)-recovered-deceased (SEAI(I1+I2)RD) physical model for unsupervised learning and two types of supervised learning, namely fuzzy proportional-integral-derivative (PID) and wavelet neural-network PID learning, are used to build a predictive-control system model that enables self-learning artificial intelligence (AI)-based control. After parameter setting, the model predicts the incoming data and calculates the value of the data set at a future moment. PID controllers are added to ensure that the system does not diverge at the beginning of iterative learning. To adapt to complex system conditions and afford excellent control, a wavelet neural-network PID control strategy is developed that can be adjusted and corrected in real time according to the output error.
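The paper's controllers are built in MATLAB/Simulink with fuzzy and wavelet neural-network PID; the sketch below only shows the generic discrete PID correction idea that underlies them, in Python, with assumed gains and a made-up prediction/observation pair.

```python
# Generic discrete PID controller sketch (not the paper's Simulink model).
# It nudges a model prediction toward an observed value each iteration, which
# is the role PID plays above: keeping early iterative learning from diverging.
class PID:
    def __init__(self, kp, ki, kd, dt=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=0.6, ki=0.05, kd=0.1)      # illustrative gains only
prediction, observed = 100.0, 120.0      # made-up values
for _ in range(30):
    prediction += pid.step(observed, prediction)
print("corrected prediction:", round(prediction, 2))
```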


Subject(s)
COVID-19/epidemiology , Computer Simulation , Models, Biological , COVID-19/transmission , Deep Learning , Fuzzy Logic , Humans , India/epidemiology , Neural Networks, Computer , Nonlinear Dynamics , Pandemics , SARS-CoV-2/physiology , United States/epidemiology
5.
Appl Intell (Dordr) ; 51(7): 4162-4198, 2021.
Article in English | MEDLINE | ID: covidwho-1009153

ABSTRACT

Measuring the spread of disease during a pandemic is critically important for accurately and promptly applying various lockdown strategies, so as to prevent the collapse of the medical system. The latest COVID-19 pandemic, which has hit the world very hard in death tolls and economic losses, is more complex and contagious than its precedent diseases. The complexity comes mostly from the emergence of asymptomatic patients and the relapse of recovered patients, which were not commonly seen during SARS outbreaks. These new characteristics of COVID-19 were only discovered lately, adding a level of uncertainty to traditional SEIR models. The contribution of this paper is that, for the COVID-19 epidemic, which is infectious in both the incubation period and the onset period, we use neural networks to learn from the actual data of the epidemic to obtain optimal parameters, thereby establishing a nonlinear, self-adaptive, dynamic-coefficient infectious disease prediction model. On the basis of the prediction, we considered control measures and simulated the effects of different control measures at different strengths. Epidemic control is predicted as a continuous change process, and epidemic development and control are integrated into the simulation and forecast so that decision-making departments can make optimal choices. The improved model is applied to simulate the COVID-19 epidemic in the United States, and by comparing the prediction results with the traditional SEIR model, the SEAIRD model, and the adaptive SEAIRD model, it is found that the adaptive SEAIRD model's predictions of the U.S. COVID-19 epidemic data are in good agreement with the actual epidemic curve. For example, for the prediction of accumulative confirmed cases by these 3 models, in terms of goodness of fit: adaptive SEAIRD model (0.99997) ≈ SEAIRD model (0.98548) > classical SEIR model (0.66837); in terms of error value: adaptive SEAIRD model (198.6563) << SEAIRD model (4739.8577) << classical SEIR model (22,652.796). The objective of this contribution is mainly to extend the current spread prediction model. It incorporates extra compartments accounting for the new features of COVID-19 and fine-tunes the new model with a neural network, in a bid to achieve a higher level of prediction accuracy. Based on the SEIR model of disease transmission, an adaptive model called SEAIRD, with an internal source and isolation intervention, is proposed. It simulates the effects of the changing behaviour of SARS-CoV-2 in the U.S. A neural network is applied to achieve a better fit in SEAIRD. Unlike the SEIR model, the adaptive SEAIRD model embraces multi-group dynamics, which lead to different evolutionary trends during the epidemic. Through the risk-assessment indicators of the adaptive SEAIRD model, it is convenient to measure the severity of the epidemic for consideration of different preventive measures. Future scenarios are projected from the trends of various indicators by running the adaptive SEAIRD model.
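The abstract compares models by "goodness of fit" and "error value". A small sketch of two such metrics is given below for any predicted-vs-observed cumulative-case series: R² for goodness of fit and RMSE as one plausible reading of "error value" (the paper does not define it here, so RMSE is an assumption). The arrays are made-up placeholders, not the paper's U.S. data.

```python
# Sketch of comparison metrics for a fitted epidemic curve (assumed definitions:
# goodness of fit = R^2, error value = RMSE). Data below are placeholders.
import numpy as np

def goodness_of_fit(observed, predicted):
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1 - ss_res / ss_tot                     # R^2, 1.0 = perfect fit

def error_value(observed, predicted):
    return np.sqrt(np.mean((observed - predicted) ** 2))   # RMSE

observed  = np.array([100, 400, 1500, 5200, 16000, 45000], dtype=float)
predicted = np.array([120, 390, 1550, 5100, 16400, 44300], dtype=float)
print("goodness of fit:", round(goodness_of_fit(observed, predicted), 5))
print("error value:    ", round(error_value(observed, predicted), 2))
```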

6.
Artificial Intelligence for Coronavirus Outbreak ; : 23-45, 2020.
Article in English | PMC | ID: covidwho-825061

ABSTRACT

Development of innovative designs, new applications, and new technologies, together with heavier investment in AI, continues to be seen every day. However, with the sudden impact of COVID-19, so severe and urgent around the world, the adoption of AI has been propelled to an unprecedented level, because it helps fight the virus pandemic by enabling one or more of the following possibilities: (1) autonomous everything, (2) pervasive knowledge, (3) assistive technology and (4) rational decision support.

7.
Artificial Intelligence for Coronavirus Outbreak ; : 1-22, 2020.
Article in English | PMC | ID: covidwho-825060

ABSTRACT

A novel coronavirus (CoV), named ‘2019-nCoV’, ‘2019 novel coronavirus’ or ‘COVID-19’ by the World Health Organization (WHO), is responsible for the current outbreak of pneumonia that began at the beginning of December 2019 in Wuhan City, Hubei Province, China [1–4]. COVID-19 is a pathogenic virus. From the phylogenetic analysis carried out with the available full genome sequences, bats appear to be the reservoir of the COVID-19 virus, but the intermediate host(s) has not yet been identified.

8.
Cognit Comput ; 12(5): 1011-1023, 2020.
Article in English | MEDLINE | ID: covidwho-716405

ABSTRACT

The coronavirus disease (COVID-19), caused by a novel coronavirus, SARS-CoV-2, has been declared a global pandemic. Due to its infection rate and severity, it has emerged as one of the major global threats of the current generation. To support the current combat against the disease, this research aims to propose a machine learning-based pipeline to detect COVID-19 infection using lung computed tomography scan images (CTI). The implemented pipeline consists of a number of sub-procedures ranging from segmenting the COVID-19 infection to classifying the segmented regions. The initial part of the pipeline segments the COVID-19-affected CTI using social group optimization-based Kapur's entropy thresholding, followed by k-means clustering and morphology-based segmentation. The next part of the pipeline implements feature extraction, selection, and fusion to classify the infection. A principal component analysis-based serial fusion technique is used to fuse the features, and the fused feature vector is then employed to train, test, and validate four different classifiers, namely Random Forest, K-Nearest Neighbors (KNN), Support Vector Machine with Radial Basis Function, and Decision Tree. Experimental results using benchmark datasets show a high accuracy (> 91%) for the morphology-based segmentation task; for the classification task, KNN offers the highest accuracy among the compared classifiers (> 87%). However, it should be noted that this method still awaits clinical validation and therefore should not be used to clinically diagnose ongoing COVID-19 infection.
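To make the classification end of such a pipeline concrete, here is a hedged sketch of PCA-based serial fusion followed by a KNN classifier. Synthetic feature matrices stand in for the CT-derived features, and the two feature groups ("shape" and "texture"), the PCA dimensions, and k=5 are assumptions; this is not the authors' implementation.

```python
# Hedged sketch: PCA-reduce two feature sets, serially fuse (concatenate) them,
# then classify with KNN. Synthetic data replace real CT features; not the
# paper's code.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
labels = rng.integers(0, 2, n)                               # 0 = non-COVID, 1 = COVID (toy)
shape_feats   = rng.normal(labels[:, None], 1.0, (n, 30))    # stand-in "shape" features
texture_feats = rng.normal(labels[:, None], 1.0, (n, 50))    # stand-in "texture" features

# PCA each feature set, then serial (concatenation) fusion.
fused = np.hstack([PCA(n_components=5).fit_transform(shape_feats),
                   PCA(n_components=5).fit_transform(texture_feats)])

X_tr, X_te, y_tr, y_te = train_test_split(fused, labels, random_state=0)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print("toy KNN accuracy on fused features:", knn.score(X_te, y_te))
```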

9.
Appl Soft Comput ; 93: 106282, 2020 Aug.
Article in English | MEDLINE | ID: covidwho-185220

ABSTRACT

Since the advent of the novel coronavirus epidemic in December 2019, governments and authorities have been struggling to make critical decisions under high uncertainty to the best of their efforts. In computer science, this represents a typical problem of machine learning over incomplete or limited data in an early epidemic. Composite Monte-Carlo (CMC) simulation is a forecasting method that extrapolates available data, broken down from multiple correlated/causal micro-data sources, into many possible future outcomes by drawing random samples from probability distributions. For instance, the overall trend and propagation of the infected cases in China are influenced by the temporal-spatial data of the cities around Wuhan city (where the virus originated), in terms of population density, travel mobility, medical resources such as hospital beds, the timeliness of quarantine control in each city, etc. Hence, a CMC is reliable only up to how closely its underlying statistical distributions represent the behaviour of the future events and to the correctness of the composite data relationships. In this paper, a case study of using a CMC enhanced by a deep learning network and fuzzy rule induction to gain better stochastic insights about the epidemic development is presented. Instead of applying simplistic and uniform assumptions for an MC, which is a common practice, a deep learning-based CMC is used in conjunction with fuzzy rule induction techniques. As a result, decision makers benefit from better-fitted MC outputs complemented by min-max rules that foretell the extreme ranges of future possibilities with respect to the epidemic.
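The composite Monte-Carlo idea, sampling each contributing micro-factor from its own distribution rather than from one uniform assumption, can be sketched as below. Every distribution, parameter, and the way the factors are combined here is an illustrative assumption; the paper fits these inputs with a deep learning network and summarizes outputs with fuzzy min-max rules, neither of which is shown.

```python
# Minimal composite Monte-Carlo sketch: draw each micro-factor from its own
# (assumed) distribution, combine, and report a projection band. Illustrative
# only; not the paper's deep-learning/fuzzy-rule CMC.
import numpy as np

rng = np.random.default_rng(42)
runs, current_cases = 10_000, 5_000

mobility   = rng.normal(1.0, 0.15, runs)        # travel-mobility multiplier
density    = rng.lognormal(0.0, 0.10, runs)     # population-density effect
quarantine = rng.beta(8, 2, runs)               # timeliness of quarantine (0..1)
base_R     = rng.normal(2.5, 0.3, runs)         # baseline transmissibility

effective_R = base_R * mobility * density * (1 - 0.6 * quarantine)
cases_14d   = current_cases * effective_R ** (14 / 7)   # assumes ~7-day generation time

lo, hi = np.percentile(cases_14d, [5, 95])
print(f"median 14-day projection: {np.median(cases_14d):,.0f}")
print(f"5-95% projection band:    {lo:,.0f} - {hi:,.0f}")
```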

10.
Non-conventional in English | WHO COVID | ID: covidwho-610788
11.
Non-conventional in English | WHO COVID | ID: covidwho-610571
12.
Non-conventional in English | WHO COVID | ID: covidwho-3177

ABSTRACT

An epidemic is a rapid and wide spread of an infectious disease that threatens many lives and damages the economy. It is important to foretell the epidemic lifetime so as to decide on timely remedial actions. These measures include closing borders and schools and suspending community services and commuting. Lifting such curfews depends on the momentum of the outbreak and its rate of decay. Being able to accurately forecast the fate of an epidemic is an extremely important but difficult task. Due to limited knowledge of the novel disease, the high uncertainty involved and the complex societal-political factors that influence the spread of the new virus, any forecast is anything but reliable. Another factor is the insufficient amount of available data. Data samples are often scarce when an epidemic has just started. With only a few training samples on hand, finding a forecasting model that offers the best possible forecast is a big challenge in machine learning. In the past, three popular methods have been proposed: (1) augmenting the existing little data, (2) using a panel selection to pick the best forecasting model from several models, and (3) fine-tuning the parameters of an individual forecasting model for the highest possible accuracy. In this paper, a methodology that embraces these three virtues of data mining from a small dataset is proposed. An experiment based on the recent coronavirus outbreak originating from Wuhan is conducted by applying this methodology. It is shown that an optimized forecasting model constructed from a new algorithm, namely a polynomial neural network with corrective feedback (PNN+cf), is able to make a forecast with relatively the lowest prediction error. The results showcase that the newly proposed methodology and PNN+cf are useful in generating acceptable forecasts at the critical time of a disease outbreak when samples are far from abundant.
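To convey the flavour of forecasting with corrective feedback on a very small sample, the sketch below fits a low-order polynomial trend and adjusts the next forecast by the recent fitting residuals. This is a toy stand-in, not the paper's polynomial neural network; the case numbers and the simple residual-mean correction are invented for illustration.

```python
# Toy sketch of "small-sample polynomial forecast + corrective feedback".
# Not the paper's PNN+cf algorithm; the series and correction rule are assumed.
import numpy as np

cases = np.array([45, 62, 121, 198, 291, 440, 571, 830], dtype=float)  # toy series
t = np.arange(len(cases))

coeffs = np.polyfit(t, cases, deg=2)         # small polynomial trend model
fitted = np.polyval(coeffs, t)

feedback = cases[-3:] - fitted[-3:]          # recent residuals act as feedback
raw_next = np.polyval(coeffs, len(cases))
corrected_next = raw_next + feedback.mean()  # corrective-feedback adjustment

print("raw polynomial forecast for next day:", round(raw_next, 1))
print("forecast with corrective feedback:   ", round(corrected_next, 1))
```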
